New reliability score for component strength using Kullback-Leibler divergence
Authors
Abstract
Similar resources
Use of Kullback–Leibler divergence for forgetting
The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...
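The asymmetry this abstract turns on is easy to see numerically: a minimal pure-Python sketch for discrete distributions (the distributions `p` and `q` below are made-up illustrative values, not from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    # Terms with p_i = 0 contribute 0 by the usual convention 0 * log 0 = 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]

d_pq = kl_divergence(p, q)  # D(p || q)
d_qp = kl_divergence(q, p)  # D(q || p)
# The two values generally differ, so the order of the arguments matters.
```

Because the two orderings give different numbers, choosing which pdf goes first is a genuine modelling decision, which is the point of the methodological result cited above.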
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
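The order-1 limit mentioned in the abstract can be checked directly from the standard definition D_α(p‖q) = (α−1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α); a small sketch with made-up distributions:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]
# As alpha approaches 1, the Rényi divergence approaches the KL divergence.
approx = [renyi_divergence(p, q, a) for a in (0.5, 0.9, 0.99, 1.01)]
```

The values in `approx` converge toward `kl_divergence(p, q)` as the order approaches 1, which is the relationship the abstract states.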
Kullback-Leibler Divergence for Nonnegative Matrix Factorization
The I-divergence, or unnormalized generalization of Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales in gradient-descent optimization may require many iterations. This is often handled by expl...
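The scale dependence described above can be shown in the simplest scalar case of the factorization: for the I-divergence d(x, wh) = x log(x/(wh)) − x + wh, rescaling (w, h) → (w/c, c·h) leaves the product and the divergence unchanged but multiplies the gradient with respect to w by c. A minimal sketch (the scalar values are made up for illustration):

```python
import math

def i_divergence(x, y):
    """Unnormalized KL (I-)divergence between positive scalars x and y."""
    return x * math.log(x / y) - x + y

def grad_w(x, w, h):
    """Derivative of i_divergence(x, w * h) with respect to w."""
    return h * (1 - x / (w * h))

x, w, h = 2.0, 1.5, 0.8
c = 10.0
# The product w * h is invariant under (w, h) -> (w / c, c * h), so the
# divergence is unchanged, yet the gradient w.r.t. w scales by c.
g1 = grad_w(x, w, h)
g2 = grad_w(x, w / c, c * h)
```

Since `g2 == c * g1`, gradient-descent step sizes that suit one scaling of the factors can be far too large or too small for another, which is the convergence issue the abstract points to.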
A comparison of score-based methods for estimating Bayesian networks using the Kullback-Leibler divergence
In this paper, we compare the performance of two methods for estimating Bayesian networks from data containing exogenous variables and random effects. The first method is fully Bayesian in which a prior distribution is placed on the exogenous variables, whereas the second method, which we call the residual approach, accounts for the effects of exogenous variables by using the notion of restrict...
Vector Quantization by Minimizing Kullback-Leibler Divergence
This paper proposes a new method for vector quantization that minimizes the Kullback-Leibler divergence between the class label distributions over the quantization inputs (the original vectors) and the outputs (the quantization subsets of the vector set). In this way, the vector quantization output can retain as much class-label information as possible. An objective function is...
Journal
Journal title: Eksploatacja i Niezawodnosc - Maintenance and Reliability
Year: 2016
ISSN: 1507-2711
DOI: 10.17531/ein.2016.3.7